Training MLP Network Using Scatter Search as a Global Optimizer

Authors

  • Heikki Maaranen
  • Kaisa Miettinen
  • Marko M. Mäkelä
Abstract

Many real-life optimization problems are nonconvex and may have several local minima within their feasible region. Therefore, global search methods are needed. Metaheuristics are efficient global optimizers in which a metastrategy guides an underlying heuristic search. Genetic algorithms, simulated annealing, tabu search, and scatter search are among the best-known metaheuristics. In general, they do not require optimization problems to be differentiable or feasible regions to be connected. Hence, metaheuristics are applicable to a wide range of problems.
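To make the idea concrete, the following is a minimal sketch of a scatter-search-style loop applied to MLP weight training. It is not the authors' algorithm: the network size, bounds, combination rule, and the omission of scatter search's local-improvement step are all simplifying assumptions made here for illustration. Note that the objective is evaluated purely by forward passes, so no differentiability is required.

```python
import math
import random

def mlp_forward(weights, x, n_hidden):
    """Tiny 1-input MLP with tanh hidden units and a linear output."""
    w1 = weights[:2 * n_hidden]        # per hidden unit: weight, bias
    w2 = weights[2 * n_hidden:]        # output weights + output bias
    h = [math.tanh(w1[2 * i] * x + w1[2 * i + 1]) for i in range(n_hidden)]
    return sum(w2[i] * h[i] for i in range(n_hidden)) + w2[n_hidden]

def mse(weights, data, n_hidden):
    """Training error: mean squared error over the data set."""
    return sum((mlp_forward(weights, x, n_hidden) - y) ** 2
               for x, y in data) / len(data)

def scatter_search(data, n_hidden=3, pop=20, ref_size=5, iters=50, seed=0):
    """Simplified scatter-search loop over MLP weight vectors
    (the local-improvement step is omitted for brevity)."""
    rng = random.Random(seed)
    dim = 3 * n_hidden + 1             # hidden params + output params
    # diverse initial population of random weight vectors
    population = [[rng.uniform(-2.0, 2.0) for _ in range(dim)]
                  for _ in range(pop)]
    population.sort(key=lambda w: mse(w, data, n_hidden))
    refset = population[:ref_size]     # reference set: current best solutions
    for _ in range(iters):
        trials = []
        for i in range(ref_size):
            for j in range(i + 1, ref_size):
                # combine a pair of reference solutions by a random
                # linear combination (may extrapolate beyond the pair)
                lam = rng.uniform(-0.5, 1.5)
                trials.append([a + lam * (b - a)
                               for a, b in zip(refset[i], refset[j])])
        # keep the best ref_size solutions among old refs and new trials
        refset = sorted(refset + trials,
                        key=lambda w: mse(w, data, n_hidden))[:ref_size]
    best = refset[0]
    return best, mse(best, data, n_hidden)
```

For example, `scatter_search([(x / 10, 0.3) for x in range(-10, 11)])` fits the small network to a constant target and returns the best weight vector together with its training error.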


Similar articles

Training an MLP neural network for image compression using the GSA method

Image compression is one of the important research fields in image processing. Up to now, different methods have been presented for image compression. Neural networks are one of these methods and have shown good performance in many applications. The usual method for training neural networks is error back-propagation, whose drawbacks are slow convergence and stopping at points of lo...


Bounding the Search Space for Global Optimization of Neural Networks Learning Error: An Interval Analysis Approach

Training a multilayer perceptron (MLP) with algorithms employing global search strategies has been an important research direction in the field of neural networks. Despite a number of significant results, an important matter concerning the bounds of the search region (typically defined as a box) where a global optimization method has to search for a potential global minimizer seems to be unresol...


G-Prop-III: Global Optimization of Multilayer Perceptrons Using an Evolutionary

This paper proposes a new version of a method (G-Prop-III, genetic backpropagation) that attempts to solve the problem of finding appropriate initial weights and learning parameters for a single-hidden-layer Multilayer Perceptron (MLP) by combining a genetic algorithm (GA) and backpropagation (BP). The GA selects the initial weights and the learning rate of the network, and changes the number o...


Novel MLP Neural Network with Hybrid Tabu Search Algorithm

In this paper, we propose a new fast, global Multilayer Perceptron Neural Network (MLP-NN) that can be used to forecast automotive prices. Nowadays, gradient-based techniques such as back-propagation are widely used for training neural networks. These techniques have only local convergence results and can therefore perform poorly even on simple problems when forecasting is out of samp...


Multi-layer Perceptron Error Surfaces: Visualization, Structure and Modelling

The Multi-Layer Perceptron (MLP) is one of the most widely applied and researched Artificial Neural Network models. MLP networks are normally applied to supervised learning tasks, which involve iterative training methods that adjust the connection weights within the network. This is commonly formulated as a multivariate non-linear optimization problem over a very high-dimensional space ...
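The formulation above can be illustrated by sampling the training error along a one-dimensional slice of the weight space. The sketch below is a hypothetical setup (a 1-2-1 tanh network fitted to a sine target, with two random weight vectors as the slice endpoints), not taken from the cited paper; it merely shows that the error is an ordinary multivariate function of the weights that can be probed pointwise.

```python
import math
import random

def net(w, x):
    # 1-2-1 MLP: w = [w1, b1, w2, b2, v1, v2, c]
    h1 = math.tanh(w[0] * x + w[1])
    h2 = math.tanh(w[2] * x + w[3])
    return w[4] * h1 + w[5] * h2 + w[6]

def training_error(w, data):
    # mean squared error over the training set
    return sum((net(w, x) - y) ** 2 for x, y in data) / len(data)

rng = random.Random(1)
data = [(x / 5.0, math.sin(x / 5.0)) for x in range(-10, 11)]
a = [rng.uniform(-3.0, 3.0) for _ in range(7)]   # two random points in the
b = [rng.uniform(-3.0, 3.0) for _ in range(7)]   # 7-dimensional weight space
# error profile along the straight line from a to b: a 1-D slice
# of the 7-dimensional error surface
profile = [training_error([ai + (t / 20.0) * (bi - ai)
                           for ai, bi in zip(a, b)], data)
           for t in range(21)]
```

Plotting `profile` against `t` would give one cross-section of the error surface; such slices are typically non-convex, which is what motivates the global optimization methods discussed in this collection.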




Publication date: 2001